
A List of 1 Billion+ Parameter LLMs

There are already over 50 different 1B+ parameter LLMs accessible via open-source checkpoints or proprietary APIs. That's not counting any private models, or models described in academic papers but with no available API or model weights. There are even more if you count fine-tuned models like Alpaca or InstructGPT. A list of the ones I know about (this is an evolving document):

GPT-J (6B) (EleutherAI)
GPT-Neo (1.3B, 2.7B, 20B) (EleutherAI)
Pythia (1B, 1.4B, 2.8B, 6.9B, 12B) (EleutherAI)
Polyglot (1.3B, 3.8B, 5.8B)
J1 (